Forecasting High-Dimensional Covariance Matrices Using High-Dimensional Principal Component Analysis

Authors

Abstract

We modify the recently proposed forecasting model of high-dimensional covariance matrices (HDCM) of asset returns using high-dimensional principal component analysis (PCA). It is well known that when the sample size is smaller than the dimension, the eigenvalues estimated by classical PCA are biased. In particular, a very small number of them are extremely large; these are called spiked eigenvalues. High-dimensional PCA gives estimators which correct these biases. This situation also arises in finance, especially when high-frequency data are handled. This research aims to estimate the HDCM of the realized covariance matrix using Nikkei 225 data, where the realized covariance matrix is estimated from 5- and 10-min intraday asset-return intervals. We construct time-series models for each component obtained by high-dimensional PCA and forecast the HDCM. Our simulation shows that high-dimensional PCA has better estimation performance when estimating the integrated covariance matrix. In our empirical analysis, we show that the forecasts can be used to construct a portfolio with smaller variance.
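As a rough, self-contained illustration of two ingredients the abstract mentions (building a realized covariance matrix from intraday returns, and de-biasing a spiked eigenvalue when the sample size is smaller than the dimension), here is a minimal Python sketch. The helper names (realized_covariance, debias_spiked_eigenvalue), the unit bulk-variance assumption, and all sizes (78 intervals, 225 assets, a single spike of 25) are illustrative assumptions, not the paper's exact estimators.

```python
import numpy as np

def realized_covariance(intraday_returns):
    """Realized covariance from a (T x p) array of intraday returns
    (e.g., 5- or 10-min log returns within one trading day)."""
    X = np.asarray(intraday_returns)
    return X.T @ X

def debias_spiked_eigenvalue(lam_hat, p, n, bulk_var=1.0):
    """De-bias one large ("spiked") sample eigenvalue using the standard
    spiked-covariance asymptotics with p/n -> c and non-spiked population
    eigenvalues equal to bulk_var. Illustrative only."""
    c = p / n
    x = lam_hat / bulk_var
    disc = (x + 1.0 - c) ** 2 - 4.0 * x
    if disc <= 0:  # spike not separated from the bulk; no correction applied
        return lam_hat
    return bulk_var * 0.5 * ((x + 1.0 - c) + np.sqrt(disc))

rng = np.random.default_rng(0)

# 1) A daily realized covariance matrix from hypothetical 5-min returns
#    (T = 78 intervals, p = 225 assets), purely to show the construction.
RC = realized_covariance(rng.standard_normal((78, 225)) * 1e-3)

# 2) Eigenvalue bias when n < p: one true spike of 25 over unit bulk variance.
p, n, spike = 225, 78, 25.0
cov = np.eye(p)
cov[0, 0] = spike
X = rng.multivariate_normal(np.zeros(p), cov, size=n)
lam_hat = np.linalg.eigvalsh(X.T @ X / n)[-1]  # biased upward, roughly spike + c*spike/(spike-1)
print(lam_hat, debias_spiked_eigenvalue(lam_hat, p, n))  # corrected value is typically close to 25
```

The correction inverts the asymptotic map between a separated population spike and its sample counterpart; when the discriminant is non-positive the spike is not distinguishable from the bulk and the sample value is returned unchanged.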


Similar Articles

High-dimensional Principal Component Analysis

High-dimensional Principal Component Analysis, by Arash Ali Amini. Doctor of Philosophy in Electrical Engineering, University of California, Berkeley; Associate Professor Martin Wainwright, Chair. Advances in data acquisition and the emergence of new sources of data in recent years have led to the generation of massive datasets in many fields of science and engineering. These datasets are usually characte...

Regularized Estimation of High-dimensional Covariance Matrices

Shrinkage Estimators for High-Dimensional Covariance Matrices

As high-dimensional data becomes ubiquitous, standard estimators of the population covariance matrix become difficult to use. Specifically, in the case where the number of samples is small (large p, small n), the sample covariance matrix is not positive definite. In this paper we explore some recent estimators of sample covariance matrices in the large p, small n setting, namely shrinkage estimat...
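For intuition, the following minimal sketch shows one widely used shrinkage estimator, Ledoit-Wolf as implemented in scikit-learn, in a large p, small n setting. It is a generic example of the idea rather than the specific estimators compared in that paper, and the sample sizes are made up.

```python
import numpy as np
from sklearn.covariance import LedoitWolf

# "Large p, small n": the sample covariance matrix is singular, while a
# shrinkage estimator blends it with a scaled identity target and stays
# positive definite.
rng = np.random.default_rng(1)
n, p = 30, 100
X = rng.standard_normal((n, p))

S = np.cov(X, rowvar=False)        # rank at most n - 1, not invertible
lw = LedoitWolf().fit(X)
Sigma_shrunk = lw.covariance_      # (p x p), positive definite

print(np.linalg.matrix_rank(S), lw.shrinkage_)  # deficient rank vs. estimated shrinkage weight
```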

Tests for High-Dimensional Covariance Matrices

We propose tests for sphericity and identity of high-dimensional covariance matrices. The tests are nonparametric without assuming a specific parametric distribution for the data. They can accommodate situations where the data dimension is much larger than the sample size, namely the “large p, small n” situations. We demonstrate by both theoretical and empirical studies that the tests have good...

On principal component analysis for high-dimensional XCSR

XCSR is an accuracy-based learning classifier system which can handle classification problems with real-valued features. However, as the number of features increases, a high classification accuracy comes at the cost of more resources: larger population sizes and longer computational running times. In this paper we investigate PCA-XCSR (a sequential application of PCA and XCSR) in three environmen...
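The following minimal sketch shows the shape of such a "PCA first, classifier second" pipeline. XCSR itself is not available in common Python libraries, so a logistic-regression stand-in is used purely to illustrate the structure; the dataset, dimensions, and number of retained components are made-up assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline

# Synthetic classification data: labels depend on only a few directions.
rng = np.random.default_rng(3)
n, d = 500, 60
X = rng.standard_normal((n, d))
y = (X[:, :5].sum(axis=1) > 0).astype(int)

# PCA reduces 60 features to 10 components before the classifier is fit.
clf = make_pipeline(PCA(n_components=10), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(scores.mean())  # cross-validated accuracy on the reduced features
```

Reducing 60 features to 10 principal components before fitting keeps training cheap, which mirrors the resource argument in the snippet.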


Journal

Journal title: Axioms

Year: 2022

ISSN: 2075-1680

DOI: https://doi.org/10.3390/axioms11120692